GPU maker Nvidia says its H100 Tensor Core GPUs running in DGX H100 systems delivered the highest performance in every test of AI inference in the latest MLPerf benchmarking round.
Nvidia DGX Cloud gives enterprises immediate access to the infrastructure and software needed to train advanced models for generative AI and other applications.
Nvidia's next-generation H100 Tensor Core GPUs and Quantum-2 InfiniBand are now widely available, both in Microsoft Azure and in more than 50 systems from partners including Asus, Atos, Dell Technologies, Gigabyte, HPE, Lenovo, and Supermicro.
Nvidia H100 GPUs set new records in all eight of the MLPerf Training benchmarks, while the A100 led the latest round of MLPerf HPC benchmarks.
Once again, GPU specialist Nvidia has used its GTC event to make several significant announcements.